
# Long-Chain Thinking Optimization

**Deepthought MOE 8X3B R1 Llama 3.2 Reasoning 18B Gguf** (Apache-2.0, by DavidAU)

An 8×3B Mixture-of-Experts model with 4 of 8 experts activated per token, each expert tuned with reasoning techniques. The nominal parameter count is 24B (8 × 3B), but because the experts share the model's non-expert layers, the actual model size is only 18.4B. Suited to creative and non-creative writing as well as general-purpose use; a minimal loading sketch follows.

Tags: Large Language Model, English
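Since the model ships as GGUF, it can be run locally with llama.cpp or its Python bindings. Below is a minimal sketch using the llama-cpp-python package; the quant file name and sampling settings are assumptions, not values from the model card, and the 4-of-8 expert routing travels in the GGUF metadata, so no extra flag should be needed.

```python
# Minimal sketch: running a GGUF quant of the model with llama-cpp-python.
# The file name below is hypothetical; substitute the quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="DeepThought-MOE-8X3B-Q4_K_M.gguf",  # hypothetical file name
    n_ctx=4096,       # context window; raise it if your hardware allows
    n_gpu_layers=-1,  # offload all layers to the GPU when one is available
)

# Long-chain reasoning models benefit from a generous max_tokens budget,
# since the chain of thought is emitted before the final answer.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Think step by step: what is 17 * 24?"}],
    max_tokens=512,
    temperature=0.8,
)
print(out["choices"][0]["message"]["content"])
```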